
    The Open Grid Computing Environments collaboration: portlets and services for science gateways

    We review the efforts of the Open Grid Computing Environments collaboration. By adopting a general three-tiered architecture based on common standards for portlets and Grid Web services, we can deliver numerous capabilities to science gateways from our diverse constituent efforts. In this paper, we discuss our support for standards-based Grid portlets using the Velocity development environment. Our Grid portlets are based on abstraction layers provided by the Java CoG Kit, which hide the differences between Grid toolkits. Sophisticated services are decoupled from the portal container using Web service strategies. We describe advanced information, semantic data, collaboration, and science application services developed by our consortium. Copyright © 2006 John Wiley & Sons, Ltd.

    Accelerating time to scientific discovery with a Grid-enhanced Microsoft Project

    The composition, execution, and monitoring of challenging scientific applications are often complex. To cope with workflow management, several tools and frameworks have been designed and put into use. However, the entry barrier to using these tools productively is high, and may hinder the progress of many scientists, or non-experts, who develop workflows infrequently. As part of our Cyberaide framework, we enable workflow definition, execution, and monitoring through the Microsoft Project software package. The motivation for this choice is that many scientists are already familiar with Microsoft Project, a project management package that is perceived to be user friendly. Through our tool we can seamlessly access Grids, such as the NSF-sponsored TeraGrid. Cyberaide abstractions also have the potential to allow integration with other resources, including Microsoft HPC clusters. We test our hypothesis of usability while evaluating the tool as part of several graduate-level courses taught in the field of Grid and Cloud computing.
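    The workflow definition, execution, and monitoring described above can be pictured as a directed acyclic graph of tasks executed in dependency order while their status is tracked. A minimal sketch (this is not the Cyberaide API; the task names and structure here are hypothetical):

    ```python
    # Minimal workflow sketch: tasks with dependencies, executed in
    # topological order with per-task status tracking. Illustrative only;
    # task names are hypothetical, and a real system would submit each
    # task to a Grid resource rather than run it inline.
    from graphlib import TopologicalSorter

    # Each task maps to the set of tasks it depends on.
    workflow = {
        "stage_input":    set(),
        "run_simulation": {"stage_input"},
        "analyze":        {"run_simulation"},
        "archive":        {"analyze"},
    }

    def run_workflow(dag):
        """Execute tasks in dependency order; return completion order and status."""
        status = {task: "pending" for task in dag}
        order = []
        for task in TopologicalSorter(dag).static_order():
            status[task] = "running"
            # ... submit the task for execution here ...
            status[task] = "done"
            order.append(task)
        return order, status

    order, status = run_workflow(workflow)
    print(order)  # tasks in an order that respects every dependency
    ```

    A Gantt-chart front end such as Microsoft Project essentially edits the `workflow` mapping; the execution layer underneath stays the same.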

    The Cumulus project: design and implementation

    Cloud computing is emerging as an innovative computing paradigm that aims to provide reliable, customized, and QoS-guaranteed computing infrastructures for users. This paper presents our early experience with Cloud computing in the Cumulus project for compute centers. We introduce the various aspects of the Cumulus project, such as its design pattern, infrastructure, and middleware.

    Cloud Computing: A Perspective Study

    Cloud computing is emerging as a new computing paradigm that aims to provide reliable, customized, and QoS-guaranteed dynamic computing environments for end users. In this paper, we study the Cloud computing paradigm from various aspects, such as its definitions, distinct features, and enabling technologies. The paper provides an introductory review of Cloud computing and surveys the state of the art of Cloud computing technologies.

    09131 Abstracts Collection -- Service Level Agreements in Grids

    From March 22 to March 27, 2009, the Dagstuhl Seminar 09131 "Service Level Agreements in Grids" was held at Schloss Dagstuhl - Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are collected in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

    Active Thermochemical Tables: thermochemistry for the 21st century

    Active Thermochemical Tables (ATcT) are a good example of a significant breakthrough in chemical science that is directly enabled by the US DOE SciDAC initiative. ATcT is a new paradigm for obtaining accurate, reliable, and internally consistent thermochemistry that overcomes the limitations intrinsic to the traditional sequential approach to thermochemistry. The availability of high-quality, consistent thermochemical values is critical in many areas of chemistry, including the development of realistic predictive models of complex chemical environments such as combustion or the atmosphere, and the development and improvement of sophisticated high-fidelity electronic structure computational treatments. As opposed to the traditional sequential evolution of thermochemical values for the chemical species of interest, ATcT utilizes the Thermochemical Network (TN) approach. This approach explicitly exposes the maze of inherent interdependencies normally ignored by the conventional treatment, and allows, inter alia, a statistical analysis of the individual measurements that define the TN. The end result is the extraction of the best possible thermochemistry, based on optimal use of all the currently available knowledge, hence making conventional tabulations of thermochemical values obsolete. Moreover, ATcT offers a number of additional features that are neither present nor possible in the traditional approach. With ATcT, new knowledge can be painlessly propagated through all affected thermochemical values. ATcT also allows hypothesis testing and evaluation, as well as discovery of weak links in the TN. The latter provides pointers to new experimental or theoretical determinations that can most efficiently improve the underlying thermochemical body of knowledge.
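    The Thermochemical Network idea can be pictured as an overdetermined linear system: each measurement constrains a linear combination of species enthalpies of formation, and a weighted least-squares solve extracts the mutually consistent values. A minimal sketch (all species, values, and uncertainties below are invented for illustration; ATcT's actual statistical machinery is far more elaborate):

    ```python
    # Toy thermochemical network: two unknown enthalpies of formation
    # (hA, hB) constrained by three measurements, reconciled by weighted
    # least squares. All numbers are invented; real ATcT networks are
    # large and use much more sophisticated statistical analysis.

    # Each measurement: (coefficients on [hA, hB], value, uncertainty sigma)
    measurements = [
        ([1.0, 0.0], 10.0, 1.0),   # direct determination of hA
        ([-1.0, 1.0], 5.0, 1.0),   # reaction enthalpy hB - hA
        ([0.0, 1.0], 16.0, 2.0),   # direct determination of hB (less precise)
    ]

    def solve_network(meas):
        """Solve the 2x2 weighted normal equations (A^T W A) h = A^T W y."""
        m = [[0.0, 0.0], [0.0, 0.0]]   # A^T W A
        b = [0.0, 0.0]                 # A^T W y
        for a, y, sigma in meas:
            w = 1.0 / sigma**2         # weight = 1 / variance
            for i in range(2):
                b[i] += w * a[i] * y
                for j in range(2):
                    m[i][j] += w * a[i] * a[j]
        det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
        hA = (m[1][1] * b[0] - m[0][1] * b[1]) / det
        hB = (m[0][0] * b[1] - m[1][0] * b[0]) / det
        return hA, hB

    hA, hB = solve_network(measurements)
    # The solution (hA ~ 10.17, hB ~ 15.33) reconciles all three inputs:
    # the implied reaction enthalpy hB - hA ~ 5.17 is consistent with each.
    ```

    This also shows why new knowledge propagates automatically: adding one measurement to the list and re-solving updates every affected value at once, which a sequential tabulation cannot do.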

    Grid Virtualization Engine: Design, Implementation, and Evaluation


    Whitepaper on Reusable Hybrid and Multi-Cloud Analytics Service Framework

    Over the last several years, the computational landscape for conducting data analytics has changed completely. While in the past much of this activity was undertaken in isolation by companies and research institutions, today's infrastructure constitutes a wealth of services offered by a variety of providers, creating opportunities for reuse and interaction through service collaboration and cooperation. This document focuses on expanding analytics services to develop a framework for reusable hybrid multi-service data analytics. It includes (a) a short technology review that explicitly targets the intersection of hybrid multi-provider analytics services, (b) a brief motivation based on use cases we examined, (c) an extension of the concept of services to show how hybrid, as well as multi-provider, services can be integrated and reused via the proposed framework, (d) analytics service composition, and (e) the integration of container technologies to achieve state-of-the-art analytics service deployment.
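    The service-composition idea in point (d) can be sketched minimally: individually reusable analytics steps, each of which could run with a different provider, are chained into one pipeline behind a common interface (the interface and the service names below are hypothetical, not the whitepaper's actual framework):

    ```python
    # Minimal sketch of analytics service composition: independent,
    # reusable services chained into one pipeline. Illustrative only;
    # each step could be backed by a different provider (on-premise
    # cluster, public cloud, container) behind the same callable interface.
    from typing import Any, Callable

    Service = Callable[[Any], Any]

    def compose(*services: Service) -> Service:
        """Chain analytics services into a single reusable pipeline."""
        def pipeline(data: Any) -> Any:
            for service in services:
                data = service(data)
            return data
        return pipeline

    # Hypothetical services for illustration.
    clean = lambda rows: [r for r in rows if r is not None]   # drop missing values
    scale = lambda rows: [r * 2 for r in rows]                # rescale
    summarize = lambda rows: sum(rows)                        # aggregate

    analytics = compose(clean, scale, summarize)
    print(analytics([1, None, 2, 3]))  # -> 12
    ```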